Maximum Entropy and Maximum Probability

Authors

  • Marian Grendar
  • Robert K. Niven
Abstract

Sanov’s Theorem and the Conditional Limit Theorem (CoLT) are established for a multicolor Pólya-Eggenberger urn sampling scheme, giving the Pólya divergence and the Pólya extension of the Maximum Relative Entropy (MaxEnt) method. Pólya MaxEnt includes standard MaxEnt as a special case. The universality of standard MaxEnt, advocated by an axiomatic approach to inference for inverse problems, is challenged in favor of a probabilistic approach based on CoLT and the Maximum Probability principle.
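
As background (not reproduced from the paper): in a multicolor Pólya-Eggenberger urn each drawn ball is returned together with additional balls of the same color, so draws are exchangeable but not independent. The i.i.d. objects that this scheme generalizes can be sketched in standard notation; the Pólya divergence itself is defined only in the paper and is not stated here.

    D(q \| p) = \sum_i q_i \log \frac{q_i}{p_i}                                      % relative entropy
    \frac{1}{n} \log P(\nu_n \in \Pi) \to - \inf_{q \in \Pi} D(q \| p)               % Sanov's Theorem, i.i.d. sampling, suitably regular \Pi
    q^* = \arg\min_{q \in \Pi} D(q \| p), \quad \Pi = \{ q : \sum_i q_i f_j(x_i) = a_j \}   % standard MaxEnt / REM

According to the abstract, the Pólya scheme replaces D with the Pólya divergence, and standard MaxEnt is recovered as a special case.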


Similar resources

Determination of Maximum Bayesian Entropy Probability Distribution

In this paper, we consider methods for determining maximum entropy multivariate distributions with a given prior, under the constraints that the marginal distributions, or the marginals and the covariance matrix, are prescribed. Next, numerical solutions are considered for cases in which a closed-form solution is unavailable. Finally, these methods are illustrated with some numerical examples.
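
A hedged formalization of the problem just described, in discrete notation of my own choosing (the continuous case is analogous with integrals): given a prior p, prescribed marginals q_k, and, when imposed, a prescribed covariance matrix \Sigma, the sought distribution solves

    \min_q D(q \| p) \quad \text{subject to} \quad
    \sum_{x_{-k}} q(x) = q_k(x_k) \ \text{for each coordinate } k, \qquad
    \sum_x q(x)\,(x - \mu)(x - \mu)^\top = \Sigma, \qquad
    q \ge 0, \ \sum_x q(x) = 1.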


Modeling of the Maximum Entropy Problem as an Optimal Control Problem and its Application to Pdf Estimation of Electricity Price

In this paper, continuous optimal control theory is used to model and solve the maximum entropy problem for a continuous random variable. The maximum entropy principle provides a method for obtaining a least-biased probability density function (Pdf) estimate. In this paper, to find a closed-form solution for the maximum entropy problem with any number of moment constraints, the entropy is consi...
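
Independently of the optimal-control route taken in the paper (not reproduced here), the stationarity (Lagrange) conditions of the moment-constrained problem give the familiar exponential-family form; a standard sketch:

    \max_f \; -\int f(x) \log f(x)\, dx \quad \text{s.t.} \quad \int f(x)\, dx = 1, \ \int x^j f(x)\, dx = \mu_j, \ j = 1, \dots, m
    \;\Rightarrow\; f^*(x) = \exp\!\Big( \lambda_0 + \sum_{j=1}^{m} \lambda_j x^j \Big),

with the multipliers \lambda_j fixed by the moment constraints.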


Maximum Probability and Relative Entropy Maximization. Bayesian Maximum Probability and Empirical Likelihood

The works briefly surveyed here are concerned with two basic methods, Maximum Probability and Bayesian Maximum Probability, as well as with their asymptotic instances, Relative Entropy Maximization and Maximum Non-parametric Likelihood. Parametric and empirical extensions of the latter methods, Empirical Maximum Maximum Entropy and Empirical Likelihood, are also mentioned. The methods are viewe...
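
A sketch of the standard method-of-types calculation behind the link between the two pairs of methods in the discrete i.i.d. case (textbook material, not specific to the surveyed works):

    P(\nu_n = \nu) = \frac{n!}{\prod_i (n\nu_i)!} \prod_i q_i^{\,n\nu_i},
    \qquad \frac{1}{n} \log P(\nu_n = \nu) = -D(\nu \| q) + O\!\big(\tfrac{\log n}{n}\big),
    \qquad \arg\max_{\nu \in \Pi} P(\nu_n = \nu) \;\longrightarrow\; \arg\min_{\nu \in \Pi} D(\nu \| q),

so selecting the most probable type (MaxProb) and minimizing relative entropy (REM) pick out the same distribution asymptotically.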


Maximum Probability and Maximum Entropy methods: Bayesian interpretation

(Jaynes’) method of (Shannon-Kullback’s) Relative Entropy Maximization (REM or MaxEnt) can, at least in the discrete case and according to the Maximum Probability Theorem (MPT), be viewed as an asymptotic instance of the Maximum Probability method (MaxProb). A simple Bayesian interpretation of MaxProb is given here. MPT carries the interpretation over into REM.


Quasi-continuous maximum entropy distribution approximation with kernel density

This paper extends maximum entropy estimation of discrete probability distributions to the continuous case. This transition leads to a nonparametric estimate of a probability density function, preserving the maximum entropy principle. Furthermore, the derived density estimate provides a minimum mean integrated square error. In a second step it is shown how boundary conditions can be included...
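
The paper's quasi-continuous construction and boundary-condition handling are not reproduced here; the Python sketch below only illustrates the kernel-density building block, with synthetic data and the default (Scott's rule) bandwidth as my own assumptions.

    # Minimal kernel density estimate; illustrative only, not the paper's
    # quasi-continuous maximum entropy construction.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    sample = rng.normal(loc=0.0, scale=1.0, size=500)   # synthetic data (assumption)

    kde = gaussian_kde(sample)             # Gaussian kernels, Scott's-rule bandwidth
    grid = np.linspace(-4.0, 4.0, 201)
    density = kde(grid)                    # estimated density on the grid

    # Differential entropy of the estimate via numerical integration
    entropy = -np.trapz(density * np.log(density + 1e-300), grid)
    print(f"estimated differential entropy: {entropy:.3f} nats")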


Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact entropy rate remains an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
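
As a concrete point of comparison (not the paper's Taylor-expansion approach), the entropy rate of such a process can be estimated by simulation via the forward algorithm, using -(1/n) log p(y_1, ..., y_n); the parameter values in the Python sketch below are my own assumptions.

    # Monte Carlo estimate of the entropy rate of a hidden Markov process:
    # a symmetric binary Markov chain observed through a binary symmetric channel.
    # Computes -(1/n) log p(y_1..y_n) with the normalized forward recursion;
    # this is a simulation baseline, not the paper's Taylor-expansion method.
    import numpy as np

    p_flip = 0.3       # Markov chain flip (transition) probability -- assumed value
    eps = 0.1          # channel crossover probability -- assumed value
    n = 50_000

    rng = np.random.default_rng(1)

    # Simulate the Markov chain x and its noisy observation y
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    flips = rng.random(n) < p_flip
    for t in range(1, n):
        x[t] = x[t - 1] ^ flips[t]
    y = x ^ (rng.random(n) < eps)

    P = np.array([[1 - p_flip, p_flip], [p_flip, 1 - p_flip]])   # transition matrix
    B = np.array([[1 - eps, eps], [eps, 1 - eps]])               # emission matrix

    alpha = np.array([0.5, 0.5])      # stationary initial distribution
    log_prob = 0.0
    for t in range(n):
        alpha = (alpha @ P if t > 0 else alpha) * B[:, y[t]]
        c = alpha.sum()               # p(y_t | y_1, ..., y_{t-1})
        log_prob += np.log(c)
        alpha /= c                    # normalize to p(x_t | y_1, ..., y_t)

    print(f"entropy rate estimate: {-log_prob / (n * np.log(2)):.4f} bits/symbol")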



Publication date: 2006